- Shallow-water platform carbonate δ13C may provide a record of changes in ocean chemistry through time, but early marine diagenesis and local processes can decouple these records from the global carbon cycle. Recent studies of calcium isotopes (δ44/40Ca) in shallow-water carbonates indicate that δ44/40Ca can be altered during early marine diagenesis, implying that δ13C may also be altered. Here, we tested the hypothesis that the platform carbonate δ13C record of the Kinderhookian-Osagean boundary excursion (KOBE), ∼353 m.y. ago, reflects a period of global diagenesis, using paired isotopic (δ44/40Ca and clumped isotopes) and trace-element geochemistry from three sections in the United States. There is little evidence for covariation between δ44/40Ca and δ13C during the KOBE. Clumped isotopes from our shallowest section support primarily sediment-buffered diagenesis at relatively low temperatures. We conclude that the δ13C record of the KOBE as recorded in shallow-water carbonate is consistent with a shift in the dissolved inorganic carbon reservoir and that, more generally, ancient shallow-water carbonates can retain records of primary seawater chemistry.
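For readers outside isotope geochemistry, a brief note on the notation used in the abstract above: δ13C and δ44/40Ca are per mil (‰) deviations of a measured isotope ratio from that of a reference standard. The standard form of the definition is given below; the choice of reference (VPDB is the usual carbon reference, while calcium values are commonly reported against a seawater or SRM 915a standard) is an assumption here, not stated in the abstract.

% Delta notation in per mil (‰). The reference standard for carbon is
% normally VPDB; the calcium reference standard varies by study.
\delta^{13}\mathrm{C} \;=\; \left( \frac{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{sample}}}{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{standard}}} - 1 \right) \times 1000
\qquad
\delta^{44/40}\mathrm{Ca} \;=\; \left( \frac{(^{44}\mathrm{Ca}/^{40}\mathrm{Ca})_{\mathrm{sample}}}{(^{44}\mathrm{Ca}/^{40}\mathrm{Ca})_{\mathrm{standard}}} - 1 \right) \times 1000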
- Binary neural networks (BNNs) substitute complex arithmetic operations with simple bit-wise operations. The binarized weights and activations in BNNs can drastically reduce memory requirements and energy consumption, making BNNs attractive for edge ML applications with limited resources. However, the severe memory capacity and energy constraints of low-power edge devices call for reducing BNN models beyond binarization. Weight pruning is a proven way to shrink many neural network (NN) models, but the binary nature of BNN weights makes it difficult to identify insignificant weights to remove. In this paper, we present a pruning method based on latent weights, combined with a layer-level pruning sensitivity analysis, which reduces the over-parameterization of BNNs and allows accuracy gains while drastically reducing model size. Our method uses a heuristic that distinguishes weights by their latent weights, the real-valued vector used to compute the pseudogradient during backpropagation. Tested with three convolutional NNs on the MNIST, CIFAR-10, and Imagenette datasets, the method delivers a 33%–46% reduction in operation count with no accuracy loss, improving on previous work in accuracy, model size, and total operation count.
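The latent-weight idea in the abstract above lends itself to a short illustration. The sketch below is a minimal, hypothetical example, not the authors' code: the function name, the 0.4 pruning ratio, and the layer shape are assumptions. It binarizes one layer's weights to ±1 and zeroes out the fraction of weights whose latent values are closest to zero, with the per-layer ratio standing in for the paper's sensitivity analysis.

# Minimal sketch of latent-weight pruning for one binarized layer.
# Assumptions: NumPy only; names and the 0.4 ratio are illustrative.
import numpy as np

def prune_by_latent_magnitude(latent_weights, prune_ratio):
    """Binarize a layer and drop the weights with the smallest latent magnitude.

    latent_weights : real-valued array maintained for backpropagation
    prune_ratio    : fraction of weights to remove in this layer (in the
                     paper this would come from a per-layer sensitivity
                     analysis; here it is supplied directly)
    Returns (pruned binary weights in {-1, 0, +1}, boolean keep mask).
    """
    magnitudes = np.abs(latent_weights).ravel()
    k = int(prune_ratio * magnitudes.size)
    if k == 0:
        keep_mask = np.ones_like(latent_weights, dtype=bool)
    else:
        # Threshold = k-th smallest latent magnitude; ties at the
        # threshold are also pruned.
        threshold = np.partition(magnitudes, k - 1)[k - 1]
        keep_mask = np.abs(latent_weights) > threshold
    binary_weights = np.sign(latent_weights) * keep_mask
    return binary_weights, keep_mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(128, 64))   # latent weights of one layer
    w_b, mask = prune_by_latent_magnitude(latent, prune_ratio=0.4)
    print(f"kept {mask.mean():.0%} of weights, all in {{-1, 0, +1}}")

In training, the keep mask would be reapplied after each latent-weight update so pruned positions stay zero while the surviving weights continue to flip sign as their latent weights change.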